Current Issue: July-September | Volume: 2015 | Issue Number: 3 | Articles: 5
A unit is said to be randomly censored when the information on the time of occurrence of an event is not available due to loss to follow-up, withdrawal, or nonoccurrence of the outcome event before the end of the study. Under independent random (noninformative) censoring, each individual has his or her own failure time T and censoring time C; however, one can only observe the random vector (X, δ), where X = min(T, C) and δ indicates whether the failure was observed. The classical approach is considered for analysing the generalised exponential distribution with random or noninformative censored samples, which occur most often in biological or medical studies. Bayes methods are also considered via the numerical approximation suggested by Lindley in 1980 and the Laplace approximation procedure developed by Tierney and Kadane in 1986, with assumed informative priors alongside the linear exponential (LINEX) loss function and the squared error loss function. A simulation study is carried out to compare the estimators proposed in this paper, and two datasets are also analysed for illustration.
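For context, here is a minimal sketch of the classical (maximum likelihood) approach the abstract refers to: fitting the generalised exponential distribution to randomly censored data. The sample size, parameter values, and the exponential censoring distribution are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

def rge(n, alpha, lam):
    # Inverse-CDF sampling from the generalised exponential:
    # F(t) = (1 - exp(-lam*t))**alpha  =>  t = -log(1 - u**(1/alpha)) / lam
    u = rng.uniform(size=n)
    return -np.log(1.0 - u**(1.0 / alpha)) / lam

n, alpha, lam = 200, 1.5, 0.8                  # illustrative true values
T = rge(n, alpha, lam)                         # latent failure times
C = rng.exponential(scale=2.0, size=n)         # assumed independent censoring times
X = np.minimum(T, C)                           # observed times
d = (T <= C).astype(float)                     # censoring indicators (delta)

def negloglik(p):
    # Noninformative censoring: likelihood uses f(x) for observed failures
    # and the survival function 1 - F(x) for censored observations.
    a, l = np.exp(p)                           # log scale keeps a, l > 0
    F = (1.0 - np.exp(-l * X))**a
    f = a * l * np.exp(-l * X) * (1.0 - np.exp(-l * X))**(a - 1.0)
    return -np.sum(d * np.log(f) + (1.0 - d) * np.log(1.0 - F))

fit = minimize(negloglik, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
print("MLE (alpha, lambda):", np.exp(fit.x))
```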
In the present paper we exploit the theory of ambit processes to develop a model which is able to effectively forecast prices of forward contracts written on the Italian energy market. Both short-term and medium-term scenarios are considered, and proper calibration procedures as well as related numerical results are provided, showing a high degree of accuracy in the obtained approximations when compared with the empirical time series of interest.
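The paper's calibrated model is not reproduced in the abstract; as a rough sketch only, the following simulates the kind of building block ambit-based models rest on, a discretised Lévy semistationary process driven here by Brownian increments. The gamma kernel and all parameter values are assumptions for illustration, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(7)

# Discretised moving-average (Levy semistationary) process
#   X_t = integral of g(t - s) dW_s,
# with an assumed gamma kernel g(x) = x**(nu - 1) * exp(-lam * x).
dt, n = 1.0 / 252.0, 2000          # daily grid, illustrative horizon
nu, lam = 0.6, 1.5                 # assumed kernel parameters
s = np.arange(1, n + 1) * dt
g = s**(nu - 1.0) * np.exp(-lam * s)        # truncated kernel weights
dW = rng.normal(0.0, np.sqrt(dt), size=n)   # Brownian increments
X = np.convolve(g, dW)[:n]                  # discrete stochastic convolution
print(X[:5])
```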
The aim of coal quality control in coal mines is to supply power plants daily with extracted raw material within certain\ncoal quality constraints. On the example of a selected part of a lignite deposit, the problem of quality control for the runof-\nmine lignite stream is discussed. The main goal is to understand potential fluctuations and deviations from production\ntargets dependent on design options before an investment is done. A single quality parameter of the deposit is selected for\nthis analysisââ?¬â?the calorific value of raw lignite. The approach requires an integrated analysis of deposit inherent variability,\nthe extraction sequence, and the blending option during material transportation. Based on drill-hole data models capturing of\nspatial variability of the attribute of consideration are generated. An analysis based on two modelling approaches, Kriging and\nsequential Gaussian simulation, reveals advantages and disadvantages lead to conclusions about their suitability for the control\nof raw material quality. In a second step, based on a production schedule, the variability of the calorific value in the lignite\nstream has been analysed. In a third step the effect of different design options, multiple excavators and a blending bed, was\ninvestigated....
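As a hedged illustration of the first modelling approach mentioned (Kriging), here is a minimal simple-kriging sketch on synthetic drill-hole data. The exponential covariance model, its sill and range, and the synthetic calorific values are all assumptions for illustration, not the deposit data or variogram used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def cov(h, sill=1.0, range_a=300.0):
    # Assumed exponential covariance model C(h) = sill * exp(-h / range_a)
    return sill * np.exp(-h / range_a)

xy = rng.uniform(0, 1000, size=(30, 2))   # synthetic drill-hole locations (m)
z = 8.5 + rng.normal(0, 1.0, size=30)     # synthetic calorific values (MJ/kg)
mu = z.mean()                             # mean assumed known (simple kriging)

target = np.array([500.0, 500.0])         # grid node to estimate
d_data = np.linalg.norm(xy[:, None] - xy[None, :], axis=2)
d_tgt = np.linalg.norm(xy - target, axis=1)

K = cov(d_data) + 1e-8 * np.eye(len(z))   # data-to-data covariance matrix
k0 = cov(d_tgt)                           # data-to-target covariances
w = np.linalg.solve(K, k0)                # kriging weights
est = mu + w @ (z - mu)                   # simple-kriging estimate
var = cov(0.0) - w @ k0                   # kriging variance
print(f"estimate {est:.2f} MJ/kg, kriging variance {var:.3f}")
```

Sequential Gaussian simulation, the second approach named in the abstract, builds on the same kriging system but draws a value from the resulting conditional distribution at each node in a random path, which is what lets it reproduce the short-scale variability that kriging smooths away.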
The purpose of the present paper is to assess the efficacy of confidence intervals for Rosenthal's fail-safe number. Although Rosenthal's estimator is widely used by researchers, its statistical properties are largely unexplored. First, we developed statistical theory which allowed us to produce confidence intervals for Rosenthal's fail-safe number. This was done by distinguishing whether the number of studies analysed in a meta-analysis is fixed or random; each case produces different variance estimators. For a given number of studies and a given distribution, we provide five variance estimators. Confidence intervals are examined with a normal approximation and a nonparametric bootstrap. The accuracy of the different confidence interval estimates was then tested by simulation under different distributional assumptions. The half-normal distribution variance estimator has the best probability coverage. Finally, we provide a table of lower confidence intervals for Rosenthal's estimator.
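For readers unfamiliar with the estimator: Rosenthal's fail-safe number is the number N of unpublished null studies needed to render a meta-analysis of k studies non-significant at level α. The standard defining relation (background context, not a result of this paper) is:

```latex
% Rosenthal's fail-safe number, given z-scores z_1, ..., z_k of the k studies
% and the critical value z_alpha of the standard normal distribution:
\[
  \frac{\sum_{i=1}^{k} z_i}{\sqrt{k + N}} = z_\alpha
  \quad\Longrightarrow\quad
  N_{\mathrm{fs}} = \frac{\left(\sum_{i=1}^{k} z_i\right)^{2}}{z_\alpha^{2}} - k .
\]
```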
We derive some simple relations that demonstrate how the posterior convergence rate is related to two driving factors: a "penalized divergence" of the prior, which measures the ability of the prior distribution to propose a nonnegligible set of working models to approximate the true model, and a "norm complexity" of the prior, which measures the complexity of the prior support, weighted by the prior probability masses. These formulas are explicit, involve no essential assumptions, and are easy to apply. We apply this approach to the case with model averaging and derive some useful oracle inequalities that can optimize the performance adaptively without knowing the true model.
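The paper's explicit "penalized divergence" and "norm complexity" formulas are not reproduced in the abstract. For orientation only, the classical prior-mass condition that posterior convergence-rate results of this type typically build on or refine reads (standard in the literature, not the paper's formula):

```latex
% Prior-mass condition at rate eps_n: the prior Pi must put enough mass on a
% Kullback-Leibler neighbourhood of the true density p_0 (K is the KL
% divergence, V its second-moment variant, c > 0 a constant):
\[
  \Pi\!\left( p :\ K(p_0, p) \le \varepsilon_n^{2},\ V(p_0, p) \le \varepsilon_n^{2} \right)
  \;\ge\; e^{-c\, n \varepsilon_n^{2}} .
\]
```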